Optimal Controllers with Complex Order Derivatives
Authors
Abstract
Similar Resources
Design of Optimal Low-Order Feedforward Controllers
Design rules for optimal feedforward controllers with lead-lag structure in the presence of measurable disturbances are presented. The design rules are based on stable first-order models with time delays (FOTD) and are optimal in the sense of minimizing the integrated squared error. The rules are derived for an open-loop setting, considering a step disturbance. This paper also discusses a general...
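As a hedged illustration of the feedforward idea in this abstract (not the paper's optimal lead-lag rules), the sketch below uses static feedforward on assumed first-order process and disturbance models without delay; all gains and time constants are made up for the example:

```python
import numpy as np

# Assumed simplification: process and disturbance share one first-order lag,
#   T_p * dy/dt = -y + K_p*u + K_d*v,
# and the control is pure static feedforward u = K_ff * v from the
# measurable disturbance v. This is NOT the paper's lead-lag design.
def simulate(K_ff, T_p=1.0, K_p=2.0, K_d=1.0, dt=1e-3, t_end=5.0):
    """Euler simulation of the output y under a unit step disturbance."""
    n = int(t_end / dt)
    y = 0.0
    ys = np.empty(n)
    for i in range(n):
        v = 1.0                 # unit step disturbance
        u = K_ff * v            # feedforward action
        y += dt / T_p * (-y + K_p * u + K_d * v)
        ys[i] = y
    return ys

# With matching dynamics, the static gain K_ff = -K_d/K_p cancels the
# disturbance exactly; with K_ff = 0 the disturbance passes through.
y_comp = simulate(K_ff=-0.5)
y_open = simulate(K_ff=0.0)
```

In the general FOTD setting of the paper the two paths have different time constants and delays, which is why a lead-lag filter (rather than a static gain) is needed.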
Locally optimal controllers and globally inverse optimal controllers
In this paper we consider the problem of global asymptotic stabilization with prescribed local behavior. We show that this problem can be formulated in terms of control Lyapunov functions. Moreover, we show that if the local control law has been synthesized employing an LQ approach, then the associated Lyapunov function can be seen as the value function of an optimal problem with some specific l...
Theory of Hybrid Fractional Differential Equations with Complex Order
We develop the theory of hybrid fractional differential equations with the complex order $\theta \in \mathbb{C}$, $\theta = m + i\alpha$, $0 < m \leq 1$, $\alpha \in \mathbb{R}$, in the Caputo sense. Using Dhage's type fixed point theorem for the product of abstract nonlinear operators in Banach algebra, where one of the operators is $\mathfrak{D}$-Lipschitzian and the other one is completely continuous, we prove the exis...
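To make the notion of a complex-order derivative concrete, here is a hedged numerical sketch using the Grünwald–Letnikov approximation, which extends directly to complex orders (the abstract itself uses the Caputo definition and fixed-point arguments; this is an illustration only):

```python
import numpy as np

# Grünwald-Letnikov approximation of D^theta f on a uniform grid of step h,
# with theta possibly complex (theta = m + i*alpha). The binomial weights
# satisfy the recursion w_k = w_{k-1} * (1 - (theta + 1)/k).
def gl_derivative(f_vals, theta, h):
    n = len(f_vals)
    w = np.empty(n, dtype=complex)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (theta + 1.0) / k)
    out = np.empty(n, dtype=complex)
    for j in range(n):
        # sum_k w_k * f(t_j - k*h), then scale by h^(-theta)
        out[j] = np.dot(w[: j + 1], f_vals[j::-1]) / h**theta
    return out

# Sanity check: theta = 1 recovers the ordinary first derivative of f(t) = t.
h = 1e-3
t = np.arange(0.0, 1.0, h)
d = gl_derivative(t.astype(complex), 1.0 + 0.0j, h)
```

For genuinely complex $\theta$ the result is complex-valued, which is why existence theory for such equations is set in Banach algebras of complex-valued functions.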
On reduced order controllers
We investigate here how some results of Hanzon and Ober [6] can be used to obtain reduced order controllers. The idea is to apply the principle of Nehari extension, which generates optimally robust stabilizing controllers, to functions which have multiple maximal singular values. It is well known that in this case the controller has a drop in degree. Using the parametrization of [6], we obtain ...
Second Order Derivatives for Network Pruning: Optimal Brain Surgeon
We investigate the use of information from all second order derivatives of the error function to perform network pruning (i.e., removing unimportant weights from a trained network) in order to improve generalization and increase the speed of further training. Our method, Optimal Brain Surgeon (OBS), is significantly better than magnitude-based methods, which can often remove the wrong weights. ...
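The core of Optimal Brain Surgeon is a saliency computed from the inverse Hessian of the error, $L_q = w_q^2 / (2 [H^{-1}]_{qq})$, plus a compensating update of the remaining weights. A minimal sketch on a toy quadratic error (the paper applies this to trained neural networks; the matrix and weights here are made-up illustrations):

```python
import numpy as np

# Minimal OBS step: prune the single weight with smallest saliency
#   L_q = w_q^2 / (2 [H^{-1}]_{qq})
# and adjust all remaining weights by
#   dw = -(w_q / [H^{-1}]_{qq}) * H^{-1} e_q
def obs_prune_one(w, H_inv):
    saliency = w**2 / (2.0 * np.diag(H_inv))
    q = int(np.argmin(saliency))          # cheapest weight to remove
    dw = -(w[q] / H_inv[q, q]) * H_inv[:, q]
    w_new = w + dw
    w_new[q] = 0.0                        # enforce exact removal
    return w_new, q

# Toy quadratic error E(w) = 0.5 (w - w0)^T H (w - w0): the Hessian is H.
H = np.array([[2.0, 0.3], [0.3, 1.0]])
w = np.array([1.0, 0.05])
w_pruned, q = obs_prune_one(w, np.linalg.inv(H))
```

Unlike magnitude-based pruning, the saliency weighs each weight's size against the error surface's curvature, and the compensating update is what lets OBS delete a weight without retraining from scratch.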
Journal
عنوان ژورنال: Journal of Optimization Theory and Applications
سال: 2012
ISSN: 0022-3239, 1573-2878
DOI: 10.1007/s10957-012-0169-4